Serving ML Models with TorchServe

#artificialintelligence

This post will walk you through the process of serving your deep learning Torch model with the TorchServe framework. There are quite a few articles about this topic; however, they typically focus either on deploying TorchServe itself or on writing custom handlers and getting the end results. That motivated me to write this post: it covers both parts and gives an end-to-end example.
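As a rough sketch of what such an end-to-end flow looks like, the usual steps are packaging the model with torch-model-archiver, starting TorchServe, and querying the inference endpoint. The file and model names below (model.pt, handler.py, mymodel, sample_input.json) are placeholders, and the commands assume torchserve and torch-model-archiver are installed:

```shell
# Package the trained model plus its handler into a .mar archive.
torch-model-archiver \
    --model-name mymodel \
    --version 1.0 \
    --serialized-file model.pt \
    --handler handler.py \
    --export-path model_store

# Start TorchServe and register the archived model.
torchserve --start --ncs \
    --model-store model_store \
    --models mymodel=mymodel.mar

# Query the inference endpoint (default inference port is 8080).
curl http://127.0.0.1:8080/predictions/mymodel -T sample_input.json
```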


Bootstrap your own Handler: How and why to create custom handlers for PyTorch's TorchServe

#artificialintelligence

TorchServe is a great tool for deploying trained PyTorch models, there is no denying that. But, as with any relatively new project, it is still building a community around it to help with the more niche aspects of its implementation, and we can contribute to that community. So today we will discuss how to develop advanced custom handlers for PyTorch's TorchServe. We will also review the process of saving your PyTorch model with torch-model-archiver, and how to include all the newly created artifacts while we are at it.
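To illustrate what a custom handler involves, here is a standalone sketch of the handler contract TorchServe expects. In a real deployment you would subclass ts.torch_handler.base_handler.BaseHandler and load your model from the paths in context.system_properties; here the class name MyHandler and the stub model are assumptions for illustration, so the preprocess → inference → postprocess flow can run on its own:

```python
import json

class MyHandler:
    """Sketch of a TorchServe-style custom handler (stubbed, not the real BaseHandler)."""

    def __init__(self):
        self.initialized = False
        self.model = None

    def initialize(self, context):
        # A real handler would read model_dir and gpu_id from context and
        # load the serialized model; a stub "model" stands in here.
        self.model = lambda xs: [sum(x) for x in xs]
        self.initialized = True

    def preprocess(self, requests):
        # Each request in the batch arrives as {"data": ...} or {"body": ...}.
        batch = []
        for req in requests:
            payload = req.get("data") or req.get("body")
            batch.append(json.loads(payload))
        return batch

    def inference(self, batch):
        return self.model(batch)

    def postprocess(self, outputs):
        # TorchServe expects one response element per request in the batch.
        return [{"result": o} for o in outputs]

    def handle(self, requests, context):
        if not self.initialized:
            self.initialize(context)
        return self.postprocess(self.inference(self.preprocess(requests)))
```

For example, `MyHandler().handle([{"data": "[1, 2, 3]"}], None)` returns `[{"result": 6}]`. The four-method split matters because TorchServe batches requests: preprocess and postprocess operate on lists, so the handler works unchanged whether the batch size is 1 or larger.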